Learning Monotonic Linear Functions

Author

  • Adam Tauman Kalai
Abstract

Learning probabilities (p-concepts [13]) and other real-valued concepts (regression) is an important task in machine learning. For example, a doctor may need to predict the probability P[y|x] of getting a disease, which depends on a number of risk factors. Generalized additive models [9] are a well-studied nonparametric model in the statistics literature, usually with monotonic link functions. However, no known efficient algorithms exist for learning such a general class. We show that regression graphs efficiently learn such real-valued concepts, while regression trees learn them inefficiently. One corollary is that any function E[y|x] = u(w · x), for u monotonic, can be learned to arbitrarily small squared error ε in time polynomial in 1/ε, |w|₁, and the Lipschitz constant of u (analogous to a margin). The model includes, as special cases, linear and logistic regression, as well as learning a noisy half-space with a margin [5, 4]. Kearns, Mansour, and McAllester [12, 15] analyzed decision trees and decision graphs as boosting algorithms for classification accuracy. We extend their analysis and the boosting analogy to the case of real-valued predictors, where a small positive correlation coefficient can be boosted to arbitrary accuracy. Viewed as a noisy boosting algorithm [3, 10], the algorithm learns both the target function and the asymmetric noise.
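The regression-graph algorithm itself is not reproduced here, but the model class in the corollary, E[y|x] = u(w · x) with u monotonic, can be made concrete with a short sketch. The code below alternates isotonic regression (to estimate the link u) with a perceptron-style update on w; this is the Isotron-style approach later formalized by Kalai and Sastry for the same class, not this paper's algorithm, and all names and parameters are illustrative.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def fit_monotonic_linear(X, y, n_iters=100):
    """Sketch: fit E[y|x] ~ u(w . x) with u monotonic.

    Alternates (1) isotonic regression of y on the projections w . x
    to estimate the monotonic link u, and (2) a perceptron-like update
    on w. Illustrative only -- not the regression-graph algorithm.
    """
    n, d = X.shape
    w = np.zeros(d)
    iso = IsotonicRegression(out_of_bounds="clip")
    for _ in range(n_iters):
        z = X @ w                       # current projections w . x
        u_z = iso.fit_transform(z, y)   # best monotonic fit of y against z
        w += (X.T @ (y - u_z)) / n      # move w to reduce squared error
    return w, iso

# Hypothetical usage with a logistic link:
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = 1 / (1 + np.exp(-X @ w_true)) + 0.05 * rng.normal(size=2000)
w_hat, u_hat = fit_monotonic_linear(X, y)
```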


Similar Resources

Deep Lattice Networks and Partial Monotonic Functions

We propose learning deep models that are monotonic with respect to a user-specified set of inputs by alternating layers of linear embeddings, ensembles of lattices, and calibrators (piecewise linear functions), with appropriate constraints for monotonicity, and jointly training the resulting network. We implement the layers and projections with new computational graph nodes in TensorFlow and use...
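The paper's layers are implemented as TensorFlow graph nodes; the snippet below is only a NumPy sketch of one ingredient the abstract mentions, a piecewise-linear calibrator whose keypoint outputs are projected back onto the monotonic set. The cumulative-max projection here is a simple stand-in, not necessarily the projection the authors use.

```python
import numpy as np

def project_monotonic(keypoint_outputs):
    """Project calibrator outputs onto the nondecreasing set.

    Sketch only: a cumulative-max projection, a simple stand-in for
    whatever projection the paper's graph nodes implement.
    """
    return np.maximum.accumulate(keypoint_outputs)

def calibrate(x, keypoint_inputs, keypoint_outputs):
    """Piecewise-linear calibrator: interpolate between learned keypoints."""
    return np.interp(x, keypoint_inputs, project_monotonic(keypoint_outputs))

# Hypothetical keypoints for one input feature:
kp_in = np.linspace(0.0, 1.0, 5)
kp_out = np.array([0.0, 0.3, 0.2, 0.7, 1.0])  # 0.2 violates monotonicity
print(calibrate(np.array([0.1, 0.5, 0.9]), kp_in, kp_out))
```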


Monotonic Calibrated Interpolated Look-Up Tables

Real-world machine learning applications may require functions to be fast-to-evaluate and interpretable; in particular, guaranteed monotonicity of the learned function can be critical to user trust. We propose meeting these goals for low-dimensional machine learning problems by learning flexible, monotonic functions using calibrated interpolated look-up tables. We extend the structural risk min...
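As a hedged illustration of the interpolated look-up tables described here (not the authors' implementation), the sketch below evaluates a 2-D table by bilinear interpolation; monotonicity in an input then reduces to the table entries being nondecreasing along that axis, which training would enforce with linear inequality constraints.

```python
import numpy as np

def lattice_lookup(x0, x1, table):
    """Bilinear interpolation into a 2-D look-up table on the unit square.

    table[i, j] holds the learned output at vertex (i, j). The model is
    monotonic in x0 iff table is nondecreasing along axis 0 (likewise
    for x1 along axis 1), since interpolation preserves that ordering.
    """
    n0, n1 = table.shape
    g0 = np.clip(x0 * (n0 - 1), 0, n0 - 1 - 1e-9)  # continuous cell coords
    g1 = np.clip(x1 * (n1 - 1), 0, n1 - 1 - 1e-9)
    i, j = int(g0), int(g1)
    a, b = g0 - i, g1 - j
    return ((1 - a) * (1 - b) * table[i, j] + a * (1 - b) * table[i + 1, j]
            + (1 - a) * b * table[i, j + 1] + a * b * table[i + 1, j + 1])

# Hypothetical 3x3 table, nondecreasing along both axes => monotonic model:
table = np.array([[0.0, 0.2, 0.5], [0.1, 0.4, 0.7], [0.3, 0.6, 1.0]])
print(lattice_lookup(0.25, 0.8, table))
```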


Learning Noisy Linear Threshold Functions

This paper describes and analyzes algorithms for learning linear threshold functions (LTFs) in the presence of classification noise and monotonic noise. When there is classification noise, each randomly drawn example is mislabeled (i.e., differs from the target LTF) with the same probability. For monotonic noise, the probability of mislabeling an example monotonically decreases with the separatio...
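The two noise models described here are easy to make concrete. The sketch below (hypothetical names throughout) labels points with a target LTF and then flips labels either with a constant probability (classification noise) or with a probability that decreases monotonically in the distance from the separating hyperplane (monotonic noise).

```python
import numpy as np

def noisy_ltf_labels(X, w, noise, rng):
    """Label with the target LTF sign(w . x), then flip according to `noise`.

    noise(margin) maps |w . x| to a flip probability: a constant function
    gives classification noise, while any function monotonically
    decreasing in the margin gives monotonic noise.
    """
    margin = X @ w
    y = np.where(margin >= 0, 1, -1)
    flip = rng.random(len(y)) < noise(np.abs(margin))
    return np.where(flip, -y, y)

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
w = np.array([1.0, -1.0, 0.5])
y_class = noisy_ltf_labels(X, w, lambda m: np.full_like(m, 0.1), rng)  # constant rate
y_mono = noisy_ltf_labels(X, w, lambda m: 0.5 * np.exp(-m), rng)       # decays with margin
```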


Monotonic Networks

Monotonicity is a constraint which arises in many application domains. We present a machine learning model, the monotonic network, for which monotonicity can be enforced exactly, i.e., by virtue of functional form. A straightforward method for implementing and training a monotonic network is described. Monotonic networks are proven to be universal approximators of continuous, differentiable mon...
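Monotonicity by functional form is typically achieved in this model family with a min-max construction: linear units with positive weights, a max within each group, and a min across groups, each stage preserving monotonicity. The forward pass below is a sketch under that assumption, not a transcription of the paper.

```python
import numpy as np

def monotonic_network(x, Z, b):
    """Forward pass of a min-max monotonic network (sketch).

    Z: (groups, units, dim) free parameters; exp(Z) makes every weight
    positive, so each linear unit is nondecreasing in every input.
    b: (groups, units) biases. Max within a group and min across groups
    both preserve monotonicity, so the whole network is monotonic.
    """
    acts = np.einsum("gud,d->gu", np.exp(Z), x) + b
    return np.min(np.max(acts, axis=1))

# Hypothetical 2-group, 3-unit network on a 4-dimensional input:
rng = np.random.default_rng(2)
Z, b = rng.normal(size=(2, 3, 4)), rng.normal(size=(2, 3))
print(monotonic_network(np.array([0.1, 0.2, 0.3, 0.4]), Z, b))
```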


UTA-NM: Explaining Stated Preferences with Additive Non-Monotonic Utility Functions

UTA methods use linear programming techniques for finding additive utility functions that best explain stated preferences. However, most UTA methods, including the popular UTA-Star, are limited to monotonic preferences. UTA-NM (Non-Monotonic) is inspired by UTA-Star but allows non-monotonic partial utility functions if they decrease total model error. The shape of the utility functions is determi...
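As a rough sketch of the LP formulation this family of methods uses (illustrative only: the variable names, breakpoint scheme, and slack-minimizing objective are assumptions, not the UTA-NM paper's exact program), the code below fits piecewise-linear partial utilities to pairwise preferences. No monotonicity constraints are imposed on the breakpoint values, which is what permits non-monotonic partial utilities.

```python
import numpy as np
from scipy.optimize import linprog

def fit_additive_utility(X, prefs, n_bp=4, delta=0.05):
    """LP sketch of UTA-style fitting of U(x) = sum_i u_i(x_i).

    X: (n_alternatives, n_criteria) criterion scores scaled to [0, 1].
    prefs: (a, b) index pairs meaning "alternative a is preferred to b".
    Each u_i is piecewise linear over n_bp breakpoints; the LP variables
    are the breakpoint values plus one slack per preference pair, and we
    minimize total slack.
    """
    d = X.shape[1]

    def weights(row):  # interpolation weights so that U(row) = weights(row) @ u
        W = np.zeros((d, n_bp))
        for i, v in enumerate(row):
            j = min(int(v * (n_bp - 1)), n_bp - 2)
            t = v * (n_bp - 1) - j
            W[i, j], W[i, j + 1] = 1 - t, t
        return W.ravel()

    n_u, n_s = d * n_bp, len(prefs)
    c = np.concatenate([np.zeros(n_u), np.ones(n_s)])  # minimize total slack
    A_ub = np.zeros((n_s, n_u + n_s))
    b_ub = np.full(n_s, -delta)
    for k, (a, b) in enumerate(prefs):
        # U(a) >= U(b) + delta - slack_k  <=>  U(b) - U(a) - slack_k <= -delta
        A_ub[k, :n_u] = weights(X[b]) - weights(X[a])
        A_ub[k, n_u + k] = -1.0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, 1)] * n_u + [(0, None)] * n_s)
    return res.x[:n_u].reshape(d, n_bp)

# Hypothetical alternatives on two criteria, with two stated preferences:
X = np.array([[0.2, 0.9], [0.8, 0.4], [0.5, 0.5]])
u = fit_additive_utility(X, prefs=[(0, 1), (2, 1)])
```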




Publication year: 2004